# Whole Word Masking MLM
## Bert Base Japanese Char V2
BERT model pre-trained on Japanese text using character-level tokenization and whole word masking, trained on a Japanese Wikipedia dump as of August 31, 2020.
Tags: Large Language Model, Japanese · Publisher: tohoku-nlp · Downloads: 134.28k · Likes: 6
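
For reference, the sketch below shows one way to run masked-token prediction with this model through the Hugging Face transformers fill-mask pipeline. The model id `tohoku-nlp/bert-base-japanese-char-v2` and the extra tokenizer dependencies (fugashi, unidic-lite) are assumptions based on the publisher and model name above, not details confirmed by this page.

```python
# Minimal usage sketch (assumed model id and dependencies):
#   pip install transformers fugashi unidic-lite
from transformers import pipeline

# Load the character-level Japanese BERT in a fill-mask pipeline.
fill_mask = pipeline(
    "fill-mask",
    model="tohoku-nlp/bert-base-japanese-char-v2",  # assumed Hugging Face id
)

# Tokenization is character-level, so [MASK] stands in for a single character.
for prediction in fill_mask("東京は日本の首[MASK]です。"):
    print(prediction["token_str"], round(prediction["score"], 4))
```

Because the model was pre-trained with whole word masking, entire MeCab-segmented words were masked together during training, even though the input is split into characters; at inference time the pipeline still predicts one masked character at a time.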